
fix: include raw LLM response in JSON parse error messages#1090

Open
octo-patch wants to merge 1 commit into ItzCrazyKns:master from octo-patch:fix/issue-997-include-raw-response-in-parse-errors

Conversation


@octo-patch octo-patch commented Apr 4, 2026

Fixes #997

Problem

When LLM responses fail JSON parsing or schema validation, the error message only shows the parse error but not the raw response from the model. This makes it nearly impossible to debug malformed outputs without modifying the underlying model server code to capture responses separately.

Solution

Include the raw response content in the error message for generateObject and streamObject methods in the OpenAI and Ollama providers (Groq inherits from OpenAI, so it is covered too).
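The change follows a simple pattern: catch the parse or validation failure and append the raw model output to the error text before rethrowing. A minimal sketch of that pattern (the function name `parseModelResponse` and the provider name in the message are illustrative, not the project's actual identifiers):

```typescript
// Hedged sketch of the error-message change: when JSON parsing of a
// model response fails, rethrow with the raw content appended so the
// malformed output is visible in logs without extra instrumentation.
function parseModelResponse(raw: string): unknown {
  try {
    return JSON.parse(raw);
  } catch (err) {
    const detail = err instanceof Error ? err.message : String(err);
    // Append the raw response so callers can see exactly what the model emitted.
    throw new Error(
      `Error parsing response from OpenAI: ${detail}\nRaw response: ${raw}`
    );
  }
}
```

Schema validation failures (e.g. from a Zod-style validator) would be handled the same way, appending the raw content after the list of validation issues.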

Before:

Error: Error parsing response from OpenAI: [{"expected":"object","code":"invalid_type",...}]

After:

Error: Error parsing response from OpenAI: [{"expected":"object","code":"invalid_type",...}]
Raw response: {"key": "unexpected_value", ...}

Testing

Manually verified the error message format change in both providers. No behavioral change — only the error message content is extended.


Summary by cubic

Adds the raw LLM response to JSON parse and schema validation error messages to make debugging malformed outputs easier. Applies to OpenAI and Ollama providers; Groq inherits the change. Fixes #997.

  • Bug Fixes
    • OpenAI: append the raw message content in generateObject and the chunk text in streamObject when parsing fails.
    • Ollama: append the raw message content in generateObject when parsing fails.
    • No functional changes; only fuller error text.

Written for commit bcee37a. Summary will update on new commits.

When parsing LLM responses fails with a JSON schema validation or parse
error, the raw response content is now included in the error message.
This makes it much easier to debug malformed outputs from models that
don't strictly follow the expected JSON format (fixes ItzCrazyKns#997).
Contributor

@cubic-dev-ai cubic-dev-ai bot left a comment


No issues found across 2 files



Development

Successfully merging this pull request may close these issues.

Errors don't contain logs for debugging - how to debug parsing errors?

1 participant